High-Dimensional $L_2$Boosting: Rate of Convergence

Authors

  • Ye Luo
  • Martin Spindler
Abstract

Boosting is one of the most significant developments in machine learning. This paper studies the rate of convergence of L2Boosting, which is tailored for regression, in a high-dimensional setting. Moreover, we introduce so-called "post-Boosting", a post-selection estimator that applies ordinary least squares to the variables selected in the first stage by L2Boosting. Another variant is orthogonal boosting, where an orthogonal projection is carried out after each step. We show that both post-L2Boosting and orthogonal boosting achieve the same rate of convergence as LASSO in a sparse, high-dimensional setting. "Classical" L2Boosting achieves a slower convergence rate for prediction, but, in contrast to the rates established for LASSO, this result imposes no assumptions on the design matrix. We also introduce rules for early stopping, which can be easily implemented and used in applied work. Moreover, our results allow a direct comparison between LASSO and boosting that has been missing in the literature. Finally, we present simulation studies to illustrate the relevance of our theoretical results and to provide insights into the practical aspects of boosting. In the simulation studies, post-L2Boosting clearly outperforms LASSO.
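The abstract does not spell out the algorithm itself, so the following is a minimal sketch, in Python/NumPy, of componentwise L2Boosting with a shrinkage step size together with the post-L2Boosting refit described above. The step size nu, the relative-improvement stopping threshold tol, and the function names are illustrative assumptions, not the exact procedure or the stopping rules analyzed in the paper.

```python
import numpy as np

def l2_boost(X, y, n_steps=500, nu=0.1, tol=1e-6):
    """Componentwise L2Boosting: in each step, regress the current residual
    on the single column that reduces the residual sum of squares the most,
    and add a shrunken version (step size nu) of that fit to the model.
    Stops early when the relative RSS improvement falls below tol
    (a placeholder rule, not the paper's stopping criterion)."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept            # start from the constant fit
    norms = (X ** 2).sum(axis=0)     # <x_j, x_j>; assumes no all-zero column
    selected = []
    rss_old = resid @ resid
    for _ in range(n_steps):
        b = X.T @ resid / norms              # simple OLS coefficient per column
        j = int(np.argmax(b ** 2 * norms))   # column with the largest RSS reduction
        beta[j] += nu * b[j]
        resid -= nu * b[j] * X[:, j]
        selected.append(j)
        rss_new = resid @ resid
        if (rss_old - rss_new) / rss_old < tol:
            break
        rss_old = rss_new
    return intercept, beta, sorted(set(selected))

def post_l2_boost(X, y, selected):
    """post-L2Boosting: ordinary least squares refit (with intercept)
    on the variables selected by the boosting stage."""
    Z = np.column_stack([np.ones(len(y)), X[:, selected]])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coef
```

Orthogonal boosting would differ from this sketch in that, after each selection, the fit is replaced by the orthogonal projection of y onto the span of all columns selected so far; that variant is not implemented here.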

Similar articles

On the Rate of Convergence of Regularized Boosting Classifiers

A regularized boosting method is introduced, for which regularization is obtained through a penalization function. It is shown through oracle inequalities that this method is model adaptive. The rate of convergence of the probability of misclassification is investigated. It is shown that for quite a large class of distributions, the probability of error converges to the Bayes risk at a rate fas...

High Order Compact Finite Difference Schemes for Solving Bratu-Type Equations

In the present study, high-order compact finite difference methods are used to solve one-dimensional Bratu-type equations numerically. The convergence analysis of the methods is discussed, and it is shown that the theoretical order of the method is consistent with its numerical rate of convergence. The maximum absolute errors in the solution at grid points are calculated, and it is shown that the ...

Boosting with the L2-Loss: Regression and Classification

This paper investigates a computationally simple variant of boosting, L2Boost, which is constructed from a functional gradient descent algorithm with the L2-loss function. As other boosting algorithms, L2Boost uses many times in an iterative fashion a pre-chosen fitting method, called the learner. Based on the explicit expression of refitting of residuals of L2Boost, the case with (symmetri...

Almost Sure Convergence Rates for the Estimation of a Covariance Operator for Negatively Associated Samples

Let {Xn, n >= 1} be a strictly stationary sequence of negatively associated random variables, with common continuous and bounded distribution function F. In this paper, we consider the estimation of the two-dimensional distribution function of (X1,Xk+1) based on histogram type estimators as well as the estimation of the covariance function of the limit empirical process induced by the se...


Journal title:

Volume   Issue

Pages   -

Publication date: 2016